A Model Selection Criterion for Classification: Application to HMM Topology Optimization

Author

  • Alain Biem
Abstract

This paper proposes a model selection criterion for classification problems. The criterion focuses on selecting models that are discriminant, rather than models based on the Occam’s razor principle of parsimony between accurate modeling and complexity. The criterion, dubbed the Discriminative Information Criterion (DIC), is applied to the optimization of Hidden Markov Model topology aimed at the recognition of cursively handwritten digits. The results show that DIC-generated models achieve an 18% relative improvement in performance over a baseline system generated by the Bayesian Information Criterion (BIC).
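The baseline method mentioned in the abstract is the Bayesian Information Criterion, which trades data fit against model complexity. A minimal sketch of BIC-based topology selection follows; the candidate topologies, log-likelihoods, and sample count are purely illustrative, not values from the paper:

```python
import math

def bic(log_likelihood: float, num_params: int, num_samples: int) -> float:
    """Bayesian Information Criterion: k * ln(n) - 2 * ln(L). Lower is better."""
    return num_params * math.log(num_samples) - 2.0 * log_likelihood

# Hypothetical candidate HMM topologies:
# (name, training log-likelihood, number of free parameters)
candidates = [
    ("3-state", -1250.0, 24),
    ("5-state", -1180.0, 60),
    ("8-state", -1175.0, 140),
]
n = 500  # number of training observations (illustrative)

# Pick the topology that minimizes BIC.
best = min(candidates, key=lambda c: bic(c[1], c[2], n))
print(best[0])
```

A discriminative criterion such as DIC would replace the per-class likelihood term with a measure of how well the model separates classes, which the abstract argues is better aligned with classification performance.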


Similar resources

Pruning transitions in a hidden Markov model with optimal brain surgeon

This paper concerns reducing the topology of a hidden Markov model (HMM) for a given task. The purpose is two-fold: (1) to select a good model topology with improved generalization capability; and/or (2) to reduce the model complexity so as to save memory and computation costs. The first goal falls into the active research area of model selection. From the model-theoretic research communi...

Full text

Information theoretic approaches to model selection

The primary problem in large vocabulary conversational speech recognition (LVCSR) is poor acoustic-level matching due to large variability in pronunciations. There is much to explore about the “quality” of states in an HMM and the interrelationships between inter-state and intra-state Gaussians used to model speech. Of particular interest is the variable discriminating power of the in...

Full text

Negative Selection Based Data Classification with Flexible Boundaries

One of the most important artificial immune algorithms is the negative selection algorithm, an anomaly detection and pattern recognition technique; however, recent research has shown the successful application of this algorithm to data classification. Most negative selection methods consider deterministic boundaries to distinguish between self and non-self spaces. In this paper, two...

Full text

Improving Chernoff criterion for classification by using the filled function

Linear discriminant analysis is a well-known matrix-based dimensionality reduction method. It is a supervised feature extraction method used in two-class classification problems. However, it is incapable of dealing with data in which classes have unequal covariance matrices. Given this issue, the Chernoff distance is an appropriate criterion for measuring distances between distributions. In the p...

Full text

Restructuring output layers of deep neural networks using minimum risk parameter clustering

This paper attempts to optimize the topology of hidden Markov models (HMMs) for automatic speech recognition. Current state-of-the-art acoustic models for ASR involve HMMs with deep neural network (DNN)-based emission density functions. Even though DNN parameters are typically trained by optimizing a discriminative criterion, topology optimization of HMMs is usually performed by optimizing a gene...

Full text


Journal title:

Volume   Issue 

Pages  -

Publication date: 2003